AI-enabled application


The 'AI tax' on AI-enabled applications in the cloud

#artificialintelligence

Back in 2019, I wrote about the "container tax." In simple terms, this is the additional cost of using containers properly within a cloud-based application, including the development, operations, and other expenses that containers incur. The goal of leveraging containers is to offset those additional costs with the benefits they offer. Many other technologies carry similar additional costs, which may or may not justify using that specific technology.


Finance Companies Ramp Up AI Deployment

#artificialintelligence

In the financial services industry, banks, insurers, asset managers and fintech companies are increasing the speed at which they deploy artificial intelligence (AI)-enabled applications, confident that AI will help them assess risk more accurately, enable operational efficiencies, and reduce costs, according to a new study by American tech firm NVIDIA. The 2023 State of AI in Financial Services report, released on February 2, 2023, draws on a survey of nearly 500 global financial services professionals that sought to understand AI trends in the sector, as well as the opportunities perceived and challenges faced by the industry. Results from the study show that the adoption of AI in the finance sector is accelerating at a fast pace, with over half of the respondents indicating they had deployed three or more of the 21 AI-enabled use cases analyzed by the survey. A fifth of respondents said they had six or more use cases in market. Accelerated adoption of AI in the sector comes on the back of increased awareness of the imperative among executive leadership teams.


The AI Tech-Stack Model

#artificialintelligence

Presently, enterprises have implemented advanced artificial intelligence (AI) technologies to support business process automation (BPA), provide valuable data insights, and facilitate employee and customer engagement.[7] However, developing and deploying new AI-enabled applications poses some management and technology challenges.[3,5,12,15] Management challenges include identifying appropriate business use cases for AI-enabled applications, lack of expertise in applying advanced AI technologies, and insufficient funding. Concerning technology challenges, organizations continuously encounter obsolete, incumbent information technology (IT)/information systems (IS) facilities; difficulty and complexity integrating new AI projects into existing IT/IS processes; immature and underdeveloped AI infrastructure; inadequate data quantity and poor data quality for learning requirements; growing security problems/threats; and inefficient data preprocessing assistance. Furthermore, major cloud service vendors (for example, Amazon, Google, and Microsoft) and third-party vendors (for instance, Salesforce and SenseTime) have stepped up efforts as major players in the AI-as-a-service (AIaaS) race by integrating cloud services with AI core components (for example, enormous amounts of data, advanced learning algorithms, and powerful computing hardware).[4] Although AIaaS offerings allow companies to leverage AI power without investing massive resources from scratch,[8] numerous issues have emerged to hinder the development of desired AI systems. For example, current AI offerings are delivered as fully bundled packages, offering limited interoperability between different vendors and raising vendor lock-in and proprietary concerns.


How artificial intelligence is making diagnosis faster and more accurate

#artificialintelligence

AI is helping to diagnose tuberculosis and cancer more accurately. Reports suggest that there was a 33 per cent increase in the TB case notification rate, and the number of drop-outs of presumptive cases fell from 72 per cent to 53 per cent. The applications and use-cases are innumerable, and increasingly, the Indian health care scene is warming to AI-based diagnoses and interventions. Nilesh Shah, president and chief of science and innovation at Metropolis Healthcare, says that several common tests, such as complete blood count and even tests for autoimmune disorders, can be done quickly and without errors using AI-enabled applications. "The share of AI in diagnosis is only going to go up in future, and this will reduce the burden on doctors, who can devote their time to complex cases. Also, there is a huge cost benefit," says Shah.


AI in banking: from the innovation lab to production

#artificialintelligence

Some 80% of the world's top financial firms are spending billions on artificial intelligence to improve their services and compete with each other. New research from NVIDIA uncovers what those firms are doing and how they're deploying these resources. Competition for consumers and their financial data continues to intensify across incumbent banks, fintech, big tech, and big-box retail. This is compounded by highly innovative digital experiences being deployed across industries, which continue to shift consumer expectations. Kevin Levitt, head of NVIDIA Financial Services, says that financial services companies must enhance the level of personalisation, data security, customer service, pricing, and more in the creation and delivery of financial products, or expect to lose market share to those who do.


Which Design Decisions in AI-enabled Mobile Applications Contribute to Greener AI?

Roger Creus Castanyer, Silverio Martínez-Fernández, Xavier Franch

arXiv.org Artificial Intelligence

Background: The construction, evolution and usage of complex artificial intelligence (AI) models demand expensive computational resources. While currently available high-performance computing environments support this complexity well, the deployment of AI models in mobile devices, which is an increasing trend, is challenging. Mobile applications are environments with low computational resources, and hence imply limitations in the design decisions during the AI-enabled software engineering lifecycle that balance the trade-off between the accuracy and the complexity of the mobile applications. Objective: Our objective is to systematically assess the trade-off between accuracy and complexity when deploying complex AI models (e.g. neural networks) to mobile devices, which have an implicit resource limitation. We aim to cover (i) the impact of the design decisions on the achievement of high-accuracy and low resource-consumption implementations; and (ii) the validation of profiling tools for systematically promoting greener AI. Method: This confirmatory registered report consists of a plan to conduct an empirical study to quantify the implications of the design decisions on the performance of AI-enabled applications and to report experiences of the end-to-end AI-enabled software engineering lifecycle. Concretely, we will implement both image-based and language-based neural networks in mobile applications to solve multiple image classification and text classification problems on different benchmark datasets. Overall, we plan to model the accuracy and complexity of AI-enabled applications in operation with respect to their design decisions, and will provide tools allowing practitioners to become aware of the quantitative relationship between the design decisions and the green characteristics of the applications under study.
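The accuracy/complexity trade-off the abstract describes can be illustrated with a back-of-the-envelope sketch: comparing the parameter counts of two small convolutional classifiers, since parameter count is a crude proxy for memory footprint and energy cost on a mobile device. The layer shapes below are hypothetical design choices for illustration, not the paper's actual experimental configurations.

```python
# Rough parameter-count comparison for two hypothetical mobile-CNN designs.
# Parameter count serves as a crude proxy for model complexity
# (memory footprint, and hence part of the energy cost on-device).

def conv2d_params(in_ch: int, out_ch: int, kernel: int) -> int:
    """Weights + biases of a standard 2D convolution layer."""
    return in_ch * out_ch * kernel * kernel + out_ch

def dense_params(in_units: int, out_units: int) -> int:
    """Weights + biases of a fully connected layer."""
    return in_units * out_units + out_units

# Design A: wider network (potentially more accurate, but heavier).
design_a = sum([
    conv2d_params(3, 64, 3),
    conv2d_params(64, 128, 3),
    dense_params(128, 10),
])

# Design B: narrower network (lighter, likely less accurate).
design_b = sum([
    conv2d_params(3, 16, 3),
    conv2d_params(16, 32, 3),
    dense_params(32, 10),
])

print(design_a, design_b)  # Design A carries roughly 14x more parameters
```

Halving channel widths at every layer shrinks the parameter count roughly quadratically, which is why width is one of the most consequential design decisions when targeting resource-constrained devices.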


AI in healthcare: The tech is here, the users are not

#artificialintelligence

Since the beginning of the year, there has been a significant uptick across health plans, healthcare providers, and analytics firms utilizing AI to change how healthcare is delivered and how patients can be more engaged in their care. AI applications in healthcare are all over the map today. Data from my firm's digital health intelligence database, DamoIntel™, has identified a significant rise in the launch of AI use cases across clinical and administrative areas in 2020. An analysis of AI/ML applications deployed by the top 50 health systems across the United States identifies that AI-enabled solutions fall into multiple technology categories: machine learning, natural language processing (NLP), conversational interfaces such as chatbots, and robotic process automation (RPA). COVID-related use cases in clinical and administrative areas contributed to the growth in adoption of newer technologies such as chatbots in healthcare.


Council Post: ModelOps To The Rescue: What Your AI Has Been Missing

#artificialintelligence

ModelOps has swooped in to make artificial intelligence (AI) accessible to anyone, anywhere. Faced with staggering stats like the fact that "On average, organizations take nine months to develop AI initiatives from prototype to production" and "as of 2018 only 47% of all AI investments make it out of the lab," data scientists and developers needed an easier way to take models from their favorite machine learning (ML) workbenches to running them at scale in production. If you're not working in data science or the AI industry, ModelOps is probably a new term. It's a relatively new technical field, so you're likely to find differing opinions, which can create confusion. Therefore, it's necessary to clarify what is meant by ModelOps.


Data Annotation Feeds the AI Beast - RTInsights

#artificialintelligence

The demand for AI-enabled applications that deliver increasingly refined results is driving the need for high-quality annotated data to train AI models. Many continuous intelligence (CI) applications need trained AI models to work. An autonomous vehicle relies on sample data sets that help it differentiate objects and identify road markings and traffic signs. Similarly, an automated video surveillance system needs a data set to learn how to distinguish between a raccoon and an intruder. If the quality of that training data is not right, the performance of the AI models will not be satisfactory.
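The claim that low-quality training data degrades model performance can be sketched with a toy experiment: a 1-nearest-neighbour classifier trained on synthetic 1-D data in which a fraction of labels is flipped to simulate sloppy annotation. The data, classifier, and noise rates are illustrative inventions, not taken from the article.

```python
import random

rng = random.Random(0)

def make_data(n, noise_rate):
    """Two 1-D clusters centred at 0.0 and 1.0; flip a fraction of
    labels to simulate low-quality annotation."""
    data = []
    for _ in range(n):
        label = rng.randint(0, 1)
        x = label + rng.gauss(0, 0.2)
        if rng.random() < noise_rate:
            label = 1 - label  # annotation error
        data.append((x, label))
    return data

def predict_1nn(train, x):
    """1-nearest-neighbour prediction: copy the closest training label."""
    return min(train, key=lambda p: abs(p[0] - x))[1]

clean_test = make_data(500, 0.0)  # noise-free evaluation set
accs = []
for noise in (0.0, 0.2, 0.4):
    train = make_data(500, noise)
    acc = sum(predict_1nn(train, x) == y for x, y in clean_test) / len(clean_test)
    accs.append(acc)
    print(f"label noise {noise:.0%}: test accuracy {acc:.2f}")
```

Because 1-NN memorizes training points, a mislabeled neighbour directly corrupts predictions, so test accuracy falls roughly in proportion to the annotation error rate, which is the intuition behind the article's point about training-data quality.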


Who Is The Leader In AI Hardware?

#artificialintelligence

A few months ago, I published a blog that highlighted Qualcomm's plans to enter the data center market with the Cloud AI100 chip sometime next year. While preparing the blog, our founder and principal analyst, Patrick Moorhead, called to point out that Qualcomm, not NVIDIA, probably has the largest market share in AI chip volume thanks to its leadership in devices for smartphones. Turns out, we were both right; it just depends on what you are counting. In the mobile and embedded space, Qualcomm powers hundreds of consumer and embedded devices running AI; it has shipped well over one billion Snapdragons and counting, all of which support some level of AI today. In the data center, however, NVIDIA likely has well over 90% share of the market for training.